Nonnegative autoencoder with simplified random neural network

Authors

  • Yonghua Yin
  • Erol Gelenbe
Abstract

This paper proposes new nonnegative (shallow and multi-layer) autoencoders by combining the spiking Random Neural Network (RNN) model, the network architectures typically used in deep learning, and a training technique inspired by nonnegative matrix factorization (NMF). The shallow autoencoder is a simplified RNN model, which is then stacked into a multi-layer architecture. The learning algorithm is based on the weight update rules of NMF, subject to the nonnegative probability constraints of the RNN. The autoencoders equipped with this learning algorithm are tested on typical image datasets, including MNIST, Yale face and CIFAR-10, as well as on 16 real-world datasets from different areas. The results obtained through these tests yield the desired high learning and recognition accuracy. Numerical simulations of the stochastic spiking behavior of this RNN autoencoder also show that it can be implemented in a highly distributed manner.
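The NMF-style weight updates the abstract refers to are multiplicative: because every factor in the update is nonnegative, the weights stay nonnegative without explicit projection. A minimal sketch of the classic multiplicative update rules (Lee and Seung) illustrates the family of rules the paper adapts; the function name `nmf_updates` and all parameters here are illustrative assumptions, not the paper's exact algorithm, which additionally enforces the RNN's probability constraints.

```python
import numpy as np

def nmf_updates(X, rank, n_iter=200, eps=1e-9, seed=0):
    """Factor a nonnegative matrix X ≈ W @ H with multiplicative updates.

    Nonnegativity is preserved automatically: each update multiplies the
    current factor by a ratio of nonnegative terms, so no clipping or
    projection step is needed (illustrative sketch, not the paper's rule).
    """
    rng = np.random.default_rng(seed)
    m, n = X.shape
    W = rng.random((m, rank))   # nonnegative random init
    H = rng.random((rank, n))
    for _ in range(n_iter):
        # H <- H * (W^T X) / (W^T W H); eps avoids division by zero
        H *= (W.T @ X) / (W.T @ W @ H + eps)
        # W <- W * (X H^T) / (W H H^T)
        W *= (X @ H.T) / (W @ H @ H.T + eps)
    return W, H

# Usage: factor a small nonnegative matrix and measure reconstruction error.
X = np.random.default_rng(1).random((8, 6))
W, H = nmf_updates(X, rank=4)
err = np.linalg.norm(X - W @ H) / np.linalg.norm(X)
```

In the paper's setting, analogous updates train the encoder/decoder weights of the RNN autoencoder while keeping them interpretable as nonnegative firing probabilities.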


Similar articles

Efficient online learning of a non-negative sparse autoencoder

We introduce an efficient online learning mechanism for nonnegative sparse coding in autoencoder neural networks. In this paper we compare the novel method to the batch algorithm non-negative matrix factorization, with and without a sparseness constraint. We show that the efficient autoencoder yields better sparseness and lower reconstruction errors than the batch algorithms on the MNIST benchm...


Alternating Optimization Method Based on Nonnegative Matrix Factorizations for Deep Neural Networks

The backpropagation algorithm for calculating gradients has been widely used in the computation of weights for deep neural networks (DNNs). This method requires derivatives of objective functions and has some difficulty finding appropriate parameters such as the learning rate. In this paper, we propose a novel approach for computing the weight matrices of fully-connected DNNs by using two types of semi-n...


Simplified Seismic Dynamic Analysis of Sloshing Phenomenon in Rectangular Tanks with Multiple Vertical Baffles

Sloshing is a well-known phenomenon in liquid storage tanks subjected to base or body motions. In recent years, the use of multiple vertical baffles for reducing sloshing effects in tanks subjected to earthquakes has received little attention. On the other hand, although some existing computer programs are capable of modeling the sloshing phenomenon with acceptable accuracy, t...


A Recurrent Latent Variable Model for Sequential Data

In this paper, we explore the inclusion of latent random variables into the hidden state of a recurrent neural network (RNN) by combining the elements of the variational autoencoder. We argue that through the use of high-level latent random variables, the variational RNN (VRNN) can model the kind of variability observed in highly structured sequential data such as natural speech. We empiricall...


A Deep Neural Network Architecture Using Dimensionality Reduction with Sparse Matrices

We present a new deep neural network architecture, motivated by sparse random matrix theory, that uses a low-complexity embedding through a sparse matrix instead of a conventional stacked autoencoder. We regard autoencoders as an information-preserving dimensionality reduction method, similar to random projections in compressed sensing. Thus, exploiting recent theory on sparse matrices for dimen...



Journal:
  • CoRR

Volume abs/1609.08151  Issue 

Pages  -

Publication date 2016